Mean square convergence analysis for kernel least mean square algorithm

Authors

  • Badong Chen
  • Songlin Zhao
  • Pingping Zhu
  • José Carlos Príncipe
Abstract

In this paper, we study the mean square convergence of the kernel least mean square (KLMS) algorithm. The fundamental energy conservation relation is established in the feature space. Starting from this relation, we carry out the mean square convergence analysis and obtain several important theoretical results, including an upper bound on the step size that guarantees mean square convergence, the theoretical steady-state excess mean square error (EMSE), an optimal step size for the fastest convergence, and an optimal kernel size for the fastest initial convergence. Monte Carlo simulation results agree well with the theoretical analysis.
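
As a concrete reference for the algorithm under analysis, the following is a minimal KLMS sketch in Python with a Gaussian kernel. The step size eta, kernel size sigma, and the toy nonlinear system are illustrative assumptions, not the paper's experimental setup; the steady-state EMSE can be estimated empirically by averaging the squared a priori error over the tail of many Monte Carlo runs.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    """Gaussian (RBF) kernel between input vector x and centre c."""
    diff = np.asarray(x) - np.asarray(c)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

def klms(inputs, desired, eta=0.2, sigma=1.0):
    """Kernel LMS: store each input as a centre with coefficient eta * error."""
    centres, coeffs, errors = [], [], []
    for x, d in zip(inputs, desired):
        # prediction is a kernel expansion over all past centres
        y = sum(a * gaussian_kernel(x, c, sigma) for a, c in zip(coeffs, centres))
        e = d - y                      # a priori error
        centres.append(x)
        coeffs.append(eta * e)         # KLMS coefficient update in feature space
        errors.append(e)
    return np.array(errors)

# Toy usage (illustrative): identify a static nonlinearity from noisy samples.
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, size=(500, 1))
d = np.sin(3 * u[:, 0]) + 0.05 * rng.standard_normal(500)
err = klms(u, d, eta=0.2, sigma=0.5)
print("empirical steady-state MSE:", np.mean(err[-100:] ** 2))
```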


Similar resources

An Analytical Model for Predicting the Convergence Behavior of the Least Mean Mixed-Norm (LMMN) Algorithm

The Least Mean Mixed-Norm (LMMN) algorithm is a stochastic gradient-based algorithm whose objective is to minimize a combination of the cost functions of the Least Mean Square (LMS) and Least Mean Fourth (LMF) algorithms. It inherits many of the properties and advantages of the LMS and LMF algorithms and mitigates some of their weaknesses. The main issue of the LMMN algorithm is t...
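
For context, a minimal sketch of the LMMN update referred to here mixes the LMS and LMF stochastic gradients with a parameter delta (delta = 1 recovers LMS, delta = 0 recovers LMF). The values of mu, delta, and the filter order below are illustrative assumptions.

```python
import numpy as np

def lmmn(x_sig, d, mu=0.01, delta=0.5, order=8):
    """Least Mean Mixed-Norm sketch: gradient mixes LMS (e) and LMF (e^3) terms.
    x_sig and d are 1-D numpy arrays of input and desired samples."""
    w = np.zeros(order)
    errors = []
    for n in range(order, len(d)):
        u = x_sig[n - order:n][::-1]                       # regressor, newest sample first
        e = d[n] - w @ u
        w = w + mu * (delta * e + (1.0 - delta) * e ** 3) * u
        errors.append(e)
    return w, np.array(errors)
```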


Kernel least mean square with adaptive kernel size

Kernel adaptive filters (KAF) are a class of powerful nonlinear filters developed in Reproducing Kernel Hilbert Space (RKHS). The Gaussian kernel is usually the default kernel in KAF algorithms, but selecting the proper kernel size (bandwidth) is still an important open issue, especially for learning with small sample sizes. In previous research, the kernel size was set manually or estimated in ...
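
The excerpt does not spell out the adaptation rule, but one plausible sketch (an assumption for illustration, not necessarily the cited paper's method) is a stochastic-gradient update of the Gaussian kernel width sigma that descends the instantaneous squared error alongside the usual KLMS coefficient update.

```python
import numpy as np

def klms_adaptive_sigma(inputs, desired, eta=0.2, mu_sigma=0.05, sigma0=1.0):
    """KLMS with a stochastic-gradient update of the kernel width (illustrative rule)."""
    centres, coeffs, errors = [], [], []
    sigma = sigma0
    for x, d in zip(inputs, desired):
        x = np.atleast_1d(x)
        sq_dists = np.array([np.sum((x - c) ** 2) for c in centres])
        k = np.exp(-sq_dists / (2.0 * sigma ** 2)) if centres else np.array([])
        y = float(np.dot(coeffs, k)) if centres else 0.0
        e = d - y
        if centres:
            # d(e^2/2)/d(sigma) = -e * sum_j a_j * k_j * ||x - c_j||^2 / sigma^3
            grad = -e * np.sum(np.array(coeffs) * k * sq_dists) / sigma ** 3
            sigma = max(sigma - mu_sigma * grad, 1e-3)   # keep the width positive
        centres.append(x)
        coeffs.append(eta * e)
        errors.append(e)
    return np.array(errors), sigma
```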


Convergence Analysis of Proportionate-type Least Mean Square Algorithms

In this paper, we present the convergence analysis of the proportionate-type least mean square (Pt-LMS) algorithm, which identifies sparse systems effectively and is well suited to real-time VLSI applications. Both first- and second-order convergence analyses of the Pt-LMS algorithm are studied. The optimum convergence behavior of the Pt-LMS algorithm is studied from the second-order convergence analysis, provid...
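
For orientation, here is a minimal sketch of a proportionate-type LMS update: each tap receives a step-size gain roughly proportional to its current magnitude, so large (active) taps adapt faster in sparse systems. The PNLMS-style gain rule and the parameters rho and delta_p are assumptions for illustration; normalization conventions vary across Pt-LMS variants.

```python
import numpy as np

def pt_lms(x_sig, d, mu=0.5, order=16, rho=0.01, delta_p=0.01):
    """Proportionate-type LMS sketch with a PNLMS-style gain assignment.
    x_sig and d are 1-D numpy arrays of input and desired samples."""
    w = np.zeros(order)
    errors = []
    for n in range(order, len(d)):
        u = x_sig[n - order:n][::-1]
        e = d[n] - w @ u
        # proportionate gains: large taps adapt faster, small taps still move
        gamma = np.maximum(rho * max(delta_p, np.max(np.abs(w))), np.abs(w))
        g = gamma / np.sum(gamma)
        w = w + mu * g * u * e
        errors.append(e)
    return w, np.array(errors)
```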


Online dictionary learning for kernel LMS: Analysis and forward-backward splitting algorithm

Adaptive filtering algorithms operating in reproducing kernel Hilbert spaces have demonstrated superiority over their linear counterparts for nonlinear system identification. Unfortunately, an undesirable characteristic of these methods is that the order of the filters grows linearly with the number of input data. This dramatically increases the computational burden and memory requirement. A var...
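
A generic forward-backward (proximal gradient) step for an l1-regularized kernel expansion gives the flavor of how dictionary growth can be controlled: a gradient step on the coefficients is followed by soft-thresholding, and centres whose coefficients reach zero are pruned. This is a sketch under those generic assumptions, not the cited paper's exact algorithm; eta, lam, and sigma are illustrative.

```python
import numpy as np

def soft_threshold(a, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(a) * np.maximum(np.abs(a) - t, 0.0)

def sparse_klms_fb(inputs, desired, eta=0.2, lam=0.01, sigma=0.5):
    """Sparsified kernel LMS via forward-backward splitting (illustrative sketch)."""
    centres, coeffs, errors = [], np.array([]), []
    for x, d in zip(inputs, desired):
        x = np.atleast_1d(x)
        k = np.array([np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2)) for c in centres])
        e = d - (float(coeffs @ k) if centres else 0.0)
        # forward (gradient) step on existing coefficients, then add the new centre
        coeffs = np.append(coeffs + eta * e * k, eta * e) if centres else np.array([eta * e])
        centres.append(x)
        # backward (proximal) step: shrink coefficients and prune zeroed centres
        coeffs = soft_threshold(coeffs, eta * lam)
        keep = np.abs(coeffs) > 0
        centres = [c for c, kept in zip(centres, keep) if kept]
        coeffs = coeffs[keep]
        errors.append(e)
    return np.array(errors), len(centres)
```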


Steady-state Performance of Incremental LMS Strategies For Parameter Estimation Over Fading Wireless Channels

We study the effect of fading in the communication channels between nodes on the performance of the incremental least mean square (ILMS) algorithm. We derive steady-state performance metrics, including the mean-square deviation (MSD), excess mean-square error (EMSE), and mean-square error (MSE). We obtain sufficient conditions to ensure mean-square convergence, and verify our results through ...
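
For reference, a minimal incremental LMS sketch is shown below: a single estimate circulates around a ring of nodes, and each node refines it with its own local data before passing it on. Ideal inter-node links are assumed for simplicity; the fading-channel effects analyzed in the cited work are not modeled, and the node setup is illustrative.

```python
import numpy as np

def incremental_lms(nodes, order=4, mu=0.01, n_cycles=200):
    """Incremental LMS sketch: each node applies a local LMS update to the
    shared estimate w, then hands it to the next node in the ring."""
    w = np.zeros(order)
    for _ in range(n_cycles):
        for draw_u, draw_d in nodes:          # visit nodes in a fixed cyclic order
            u = draw_u()                       # node draws a fresh regressor
            d = draw_d(u)                      # and the matching desired sample
            w = w + mu * u * (d - u @ w)       # local LMS step on the shared estimate
    return w

# Illustrative usage: 5 nodes observing the same unknown vector through noisy data.
rng = np.random.default_rng(1)
w_true = rng.standard_normal(4)
def make_node(noise_std):
    return (lambda: rng.standard_normal(4),
            lambda u: u @ w_true + noise_std * rng.standard_normal())
nodes = [make_node(0.1 * (k + 1)) for k in range(5)]
print(np.round(incremental_lms(nodes), 3), np.round(w_true, 3))
```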



Journal:
  • Signal Processing

Volume 92, Issue -

Pages -

Publication date: 2012